Conversation

@rvandam commented Jun 21, 2017

Don't permanently cache the current plan until the first test, in case there are any leading comments (meaning we don't yet have a plan so caching it might be premature).
@Leont (Member) commented Jul 17, 2017

I think this fix is suboptimal. It shouldn't depend on having seen any tests; only the plan should matter.

@rvandam (Author) commented Jul 17, 2017

I chose that as a compromise over removing the caching altogether. Without the caching, the non-verbose output would switch from the X/? form to the X/Y form whenever a plan 1..Y was encountered, even if that plan appeared somewhere in the middle of the TAP output. I wanted to limit the window in which the plan can still be picked up to the point where the first test is encountered (so leading comments are skipped over), based on this portion of the TAP spec:

The plan is optional but if there is a plan before the test points it must be the first non-diagnostic line output by the test file. In certain instances a test file may not know how many test points it will ultimately be running. In this case the plan can be the last non-diagnostic line in the output. The plan cannot appear in the middle of the output, nor can it appear more than once.

Of course, this is in a formatter, not in a parser, so perhaps it doesn't matter what the spec says at this point and the formatter should just roll with whatever the parser gives it, even if that is in violation of the spec? In that case, I would say that removing the local caching is the best solution, unless there's a concern that some other formatter that inherits from this one has somehow turned the plan lookup into an expensive operation (but then that formatter should probably handle its own caching, IMO).
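
For illustration, a minimal sketch of the compromise being described, assuming a formatter-session-style result callback and a TAP::Parser-like object. The attribute names (`_plan`, `_seen_test`, the `parser` slot) and the package name are hypothetical, not the actual Test-Harness internals:

```perl
package My::Formatter::Session;
use strict;
use warnings;

# Called once for every TAP::Parser::Result the parser emits.
sub result {
    my ( $self, $result ) = @_;

    # Note when a real test point has been seen; leading comments and
    # other non-test lines do not flip this flag.
    $self->{_seen_test} = 1 if $result->is_test;

    # Cache the plan only once the first test point has appeared. Per the
    # spec, a leading plan must precede the test points, so by now the
    # planned count is reliable; caching earlier (e.g. while only comments
    # have been seen) could permanently freeze an unknown plan.
    if ( !defined $self->{_plan} && $self->{_seen_test} ) {
        $self->{_plan} = $self->{parser}->tests_planned;
    }

    my $total = defined $self->{_plan} ? $self->{_plan} : '?';
    printf "%d/%s\r", $self->{parser}->tests_run, $total;
}

1;
```

Under that sketch, a trailing plan never gets cached mid-run, so the display stays X/? until the run ends, while a leading plan is locked in as soon as the first test arrives.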

Leont self-assigned this May 4, 2018